Generalization of Information Measures

Authors

  • Po-Ning Chen
  • Fady Alajaji
Abstract

General formulas for entropy, mutual information, and divergence are established. It is revealed that these quantities are actually determined by three decisive sequences of random variables: the normalized source information density, the normalized channel information density, and the normalized log-likelihood ratio, respectively. In terms of the ultimate cumulative distribution functions, or spectra, of these random sequences, entropy, mutual information, and divergence are each expressed in their most general form. In light of the newly defined quantities, general data compaction and data compression (source coding) theorems for block codes, as well as the Neyman-Pearson type-II error exponent subject to upper bounds on the type-I error probability, are derived.
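To make the three sequences concrete, the display below is a sketch in standard information-spectrum notation (in the spirit of Han and Verdú); the symbols $\mathbf{X}=\{X^n\}$, $W^n = P_{Y^n\mid X^n}$ and $(P_{X^n},Q_{X^n})$, as well as the example formula, are illustrative assumptions rather than quotations from the paper:

$$\frac{1}{n}h_{X^n}(X^n) = -\frac{1}{n}\log P_{X^n}(X^n), \qquad \frac{1}{n}i_{X^nW^n}(X^n;Y^n) = \frac{1}{n}\log\frac{P_{Y^n\mid X^n}(Y^n\mid X^n)}{P_{Y^n}(Y^n)}, \qquad \frac{1}{n}\log\frac{P_{X^n}(X^n)}{Q_{X^n}(X^n)}.$$

The general measures are then read off the limiting cumulative distribution functions (spectra) of these sequences; for instance, the sup-entropy rate is the smallest threshold that the normalized source information density exceeds only with vanishing probability,

$$\bar{H}(\mathbf{X}) = \inf\Bigl\{\beta : \lim_{n\to\infty}\Pr\Bigl[-\tfrac{1}{n}\log P_{X^n}(X^n) > \beta\Bigr] = 0\Bigr\}.$$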


Similar articles

A research on classification performance of fuzzy classifiers based on fuzzy set theory

Due to the complexity of objects and the vagueness of the human mind, fuzzy classification algorithms have attracted considerable attention from researchers. In this paper, we propose a concept of fuzzy relative entropy to measure the divergence between two fuzzy sets. Applying fuzzy relative entropy, we prove that patterns with high fuzziness are close to the classi...
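For orientation only (the truncated abstract above does not give the authors' definition, so the formula below is an assumption showing a commonly used fuzzy divergence in the style of Bhandari and Pal): for fuzzy sets $A$ and $B$ with membership functions $\mu_A,\mu_B$ on a finite universe $X$,

$$D(A,B) = \sum_{x\in X}\Bigl[\mu_A(x)\log\frac{\mu_A(x)}{\mu_B(x)} + \bigl(1-\mu_A(x)\bigr)\log\frac{1-\mu_A(x)}{1-\mu_B(x)}\Bigr],$$

a fuzzy-set analogue of the Kullback-Leibler divergence; the fuzzy relative entropy proposed in the cited paper may differ in its details.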


Existence of solutions of infinite systems of integral equations in the Frechet spaces

In this paper we apply the technique of measures of noncompactness to the theory of infinite systems of integral equations in Fréchet spaces. Our aim is to provide a few generalizations of the Tychonoff fixed point theorem and prove the existence of solutions for infinite systems of nonlinear integral equations with the help of the technique of measures of noncompactness and a generalization of Tych...
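For context, and as an assumption since the truncated abstract does not specify which measure of noncompactness is used, the prototypical example is the Kuratowski measure: for a bounded subset $A$ of a metric space,

$$\alpha(A) = \inf\Bigl\{\varepsilon>0 : A\subseteq\bigcup_{i=1}^{m}S_i \text{ for finitely many sets } S_1,\dots,S_m \text{ with } \operatorname{diam}(S_i)\le\varepsilon\Bigr\},$$

so $\alpha(A)=0$ exactly when $A$ is totally bounded; fixed point theorems of Darbo/Tychonoff type exploit maps that shrink this quantity.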


Entropy of infinite systems and transformations

The Kolmogorov-Sinai entropy is a far-reaching dynamical generalization of the Shannon entropy of information systems. This entropy works perfectly for probability-measure-preserving (p.m.p.) transformations. However, it is not useful when there is no finite invariant measure. There are certain successful extensions of the notion of entropy to infinite measure spaces, or transformations with ...
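For reference, the textbook definition being generalized here (this is the standard Kolmogorov-Sinai entropy, not a formula taken from the truncated abstract): for a p.m.p. transformation $T$ of a probability space $(X,\mathcal{B},\mu)$ and a finite measurable partition $\xi$,

$$h_\mu(T,\xi) = \lim_{n\to\infty}\frac{1}{n}H_\mu\Bigl(\bigvee_{k=0}^{n-1}T^{-k}\xi\Bigr), \qquad h_\mu(T) = \sup_{\xi}h_\mu(T,\xi),$$

where $H_\mu$ denotes the Shannon entropy of a partition; the obstruction mentioned above is that this construction presupposes a finite invariant measure $\mu$.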


On a p-Laplacian system and a generalization of the Landesman-Lazer type condition

This article shows the existence of weak solutions of a resonance problem for a nonuniform p-Laplacian system in a bounded domain in $\mathbb{R}^N$. Our arguments are based on the minimum principle and rely on a generalization of the Landesman-Lazer type condition.


Totally probabilistic Lp spaces

In this paper, we introduce the notion of probabilistic valued measures as a generalization of non-negative measures and construct the corresponding Lp spaces for distributions p > 0. It is also shown that if the distribution p satisfies p ≥ 1 then, as in the classical case, these spaces are complete probabilistic normed spaces.


The associated measure on locally compact cocommutative KPC-hypergroups

We study harmonic analysis on cocommutative KPC-hypergroups, which are a generalization of DJS-hypergroups, introduced by Kalyuzhnyi, Podkolzin and Chapovsky. We prove that there is a relationship between the associated measures $\mu$ and $\gamma\mu$, where $\mu$ is a Radon measure on a KPC-hypergroup $Q$ and $\gamma$ is a character on $Q$.



Journal title:

Volume:   Issue:

Pages:

Publication date: 2017